Neural Rule-Execution Tracking Machine for Transformer-Based Text Generation

Neural Information Processing Systems

Sequence-to-Sequence (Seq2Seq) neural text generation models, especially pre-trained ones (e.g., BART and T5), have exhibited compelling performance on various natural language generation tasks. However, the black-box nature of these models limits their application in tasks where specific rules (e.g., controllable constraints, prior knowledge) need to be executed.
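To make the notion of "executing rules" during generation concrete, here is a minimal toy sketch (not the paper's method): greedy decoding over a hand-made vocabulary and hypothetical per-step scores standing in for a Seq2Seq decoder, with two simple rules applied at decode time — a banned word and a word that must appear before the sequence may end. All names and scores below are illustrative assumptions.

```python
# Toy sketch of rule-constrained decoding; the vocabulary, the fake
# scoring table, and both rules are invented for illustration only.
VOCAB = ["<eos>", "the", "cat", "sat", "mat", "dog"]

def fake_logits(step):
    # Hypothetical per-step scores standing in for a real decoder
    # (a real system such as BART or T5 would produce learned logits).
    table = [
        {"the": 2.0, "dog": 1.5},
        {"dog": 2.0, "cat": 1.9},
        {"sat": 2.0},
        {"<eos>": 2.0},
    ]
    scores = {tok: -1.0 for tok in VOCAB}
    scores.update(table[min(step, len(table) - 1)])
    return scores

def generate(max_steps=4, must_include=None, banned=None):
    """Greedy decoding with two rules: never emit a banned word, and
    do not stop until the required word has been generated."""
    banned = set(banned or [])
    out = []
    for step in range(max_steps):
        scores = fake_logits(step)
        for tok in banned:
            scores[tok] = float("-inf")      # rule: ban this word
        if must_include and must_include not in out:
            scores["<eos>"] = float("-inf")  # rule: cannot stop yet
            scores[must_include] += 5.0      # rule: boost required word
        tok = max(scores, key=scores.get)
        if tok == "<eos>":
            break
        out.append(tok)
    return out

print(generate())                                    # -> ['the', 'dog', 'sat']
print(generate(must_include="mat", banned=["dog"]))  # -> ['mat', 'cat', 'sat']
```

Even this toy shows why post-hoc filtering is not enough: the constraints must intervene inside the decoding loop, which is exactly where a black-box model offers no hook by default.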